# Chinese Pretraining
## MiniRBT-H288
- Developer: hfl
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Chinese
- Downloads: 405 | Likes: 8

MiniRBT is a small Chinese pretrained model built with knowledge distillation, using Whole Word Masking to improve training efficiency.
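A minimal usage sketch (not from the model card): loading the checkpoint through the Transformers API, assuming it is published on the Hugging Face Hub under the id `hfl/minirbt-h288`, with the "H288" in the name taken to mean a 288-dimensional hidden size.

```python
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "hfl/minirbt-h288"  # assumed Hub id, based on the model name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# Encode a Chinese sentence and inspect the contextual embeddings.
inputs = tokenizer("这是一个小型中文预训练模型。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 288) if H288 is the hidden size
```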
## Chinese RoBERTa (L-8, H-256)
- Developer: uer
- Tags: Large Language Model, Chinese
- Downloads: 15 | Likes: 1

A Chinese RoBERTa model pretrained on CLUECorpusSmall, with 8 layers and a hidden size of 256 (matching the L-8, H-256 in the name), suitable for a range of Chinese NLP tasks.
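Since this is a masked language model, a short sketch of masked-token prediction follows, assuming the Hub id `uer/chinese_roberta_L-8_H-256` and BERT-style `[MASK]` tokens.

```python
from transformers import pipeline

# Assumed Hub id; the model predicts BERT-style [MASK] positions.
fill_mask = pipeline("fill-mask", model="uer/chinese_roberta_L-8_H-256")

# Print the top candidate fillers with their scores.
for candidate in fill_mask("北京是[MASK]国的首都。"):
    print(candidate["token_str"], round(candidate["score"], 3))
```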
## T5-Small Chinese (CLUECorpusSmall)
- Developer: uer
- Tags: Large Language Model, Chinese
- Downloads: 1,336 | Likes: 19

A small Chinese T5 model pretrained on CLUECorpusSmall with the UER-py framework; it casts Chinese NLP tasks into a unified text-to-text format.
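A sketch of text-to-text span in-filling, assuming the Hub id `uer/t5-small-chinese-cluecorpussmall` and UER's convention of pairing the T5 model with a BERT-style tokenizer and `extra0`-style sentinel markers.

```python
from transformers import BertTokenizer, T5ForConditionalGeneration, Text2TextGenerationPipeline

MODEL_ID = "uer/t5-small-chinese-cluecorpussmall"  # assumed Hub id

# UER's Chinese T5 models use a BERT-style tokenizer rather than SentencePiece.
tokenizer = BertTokenizer.from_pretrained(MODEL_ID)
model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)

generator = Text2TextGenerationPipeline(model, tokenizer)
# The model fills in the "extra0" sentinel, text-to-text style.
print(generator("中国的首都是extra0京", max_length=50, do_sample=False))
```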